On parameters transformations for emulating sparse priors using variational-Laplace inference
Author
Abstract
So-called sparse estimators arise in the context of model fitting, when one a priori assumes that only a few (unknown) model parameters deviate from zero. Sparsity constraints can be useful when the estimation problem is under-determined, i.e. when the number of model parameters is much higher than the number of data points. Typically, such constraints are enforced by minimizing the L1 norm, which yields the so-called LASSO estimator. In this work, we propose a simple parameter transform that emulates sparse priors without sacrificing the simplicity and robustness of L2-norm regularization schemes. We show how L1 regularization can be obtained with a "sparsify" remapping of parameters under normal Bayesian priors, and we demonstrate the ensuing variational-Laplace approach using Monte-Carlo simulations.
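The remapping idea can be illustrated with a toy sketch. The map used here, theta = sign(x) * x^2 / 2, is one standard choice with the stated property (an L2 penalty on x equals an L1 penalty on theta, since |theta| = x^2 / 2); the paper's exact transform may differ, and the optimizer below is a plain quasi-Newton fit rather than the variational-Laplace scheme itself.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Under-determined linear model: n data points, p >> n parameters,
# of which only a few are truly nonzero.
n, p = 20, 50
X = rng.standard_normal((n, p))
theta_true = np.zeros(p)
theta_true[:3] = [2.0, -1.5, 1.0]
y = X @ theta_true + 0.05 * rng.standard_normal(n)

def sparsify(x):
    # "Sparsify" remapping: |theta| = x**2 / 2, so a Gaussian (L2)
    # penalty on x acts as a Laplace-like (L1) penalty on theta.
    return np.sign(x) * x**2 / 2.0

def objective(x, lam=1.0):
    theta = sparsify(x)
    resid = y - X @ theta
    # L2 regularization on x emulates L1 regularization on theta.
    return 0.5 * resid @ resid + lam * 0.5 * x @ x

x0 = rng.standard_normal(p)
res = minimize(objective, x0, method="BFGS")
theta_hat = sparsify(res.x)   # sparse estimate in the original space
```

Note that the objective is smooth in x (unlike the L1-penalized objective in theta), which is what allows standard Gaussian-prior machinery to be reused unchanged.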
Similar resources
A Truncated Variational EM Approach for Spike-and-Slab Sparse Coding
We study inference and learning based on a sparse coding model with a ‘spike-and-slab’ prior. As in standard sparse coding, the model used assumes independent latent sources that linearly combine to generate data points. However, instead of using a standard sparse prior such as a Laplace distribution, we study the application of a more flexible ‘spike-and-slab’ distribution which models the absence ...
A truncated EM approach for spike-and-slab sparse coding
We study inference and learning based on a sparse coding model with ‘spike-and-slab’ prior. As in standard sparse coding, the model used assumes independent latent sources that linearly combine to generate data points. However, instead of using a standard sparse prior such as a Laplace distribution, we study the application of a more flexible ‘spike-and-slab’ distribution which models the absen...
Bayesian Sparsity for Intractable Distributions
Bayesian approaches for single-variable and group-structured sparsity outperform L1 regularization, but are challenging to apply to large, potentially intractable models. Here we show how noncentered parameterizations, a common trick for improving the efficiency of exact inference in hierarchical models, can similarly improve the accuracy of variational approximations. We develop this with two ...
On Bayesian classification with Laplace priors
We present a new classification approach, using a variational Bayesian estimation of probit regression with Laplace priors. Laplace priors have previously been used extensively as a sparsity-inducing mechanism to perform feature selection simultaneously with classification or regression. However, contrary to the ‘myth’ of sparse Bayesian learning with Laplace priors, we find that the sparsity...
Variational Bayesian Multinomial Probit Regression with Gaussian Process Priors
It is well known in the statistics literature that augmenting binary and polychotomous response models with Gaussian latent variables enables exact Bayesian analysis via Gibbs sampling from the parameter posterior. By adopting such a data augmentation strategy, dispensing with priors over regression coefficients in favor of Gaussian process (GP) priors over functions, and employing variational a...
Journal title:
Volume / Issue
Pages -
Publication date: 2017